There were a few tasks scheduled. The very first was to get to know Max 7

Afterwards, we experimented with video and audio effects

After that came a combination of audio and video effects, playing with audio visualisations driven by user input and, at last, connecting an Arduino with a Processing sketch and Max 7




The main challenge was getting to know the application itself; despite it feeling relatively intuitive, there were still many unknowns. In addition, the workflow was slow, as some actions like deleting could be made quicker with hotkeys (in my case alt). Figuring out the logic was also tricky at times: does an effect filter the audio/video, or does it attach on top of it? What do the different pipelines do? What sorts of values are acceptable?

Also, some objects didn't make sense to me at first. For instance, some sort of flickering button thing that was necessary to parse the data arriving from Processing over OSC/UDP. But it worked, so I shouldn't complain at this point

Splitting the values was also a bit tricky; Max 7 seems to want the values handed to it explicitly, separated out, before they can be used to manipulate anything inside the patch
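
For comparison, here is a minimal, hypothetical Processing sketch showing the same kind of splitting on the receiving end: an incoming /test message carrying two floats (like the one sent in the sketch further down) is pulled apart by index, which is roughly what an unpack object does inside a Max patch. The listening port here is an assumption, not necessarily the one I used.

//Hypothetical receiver sketch: splits a two-float /test message into separate values
import oscP5.*;
import netP5.*;

OscP5 oscP5;

void setup() {
  size(200, 200);
  /* listen for incoming OSC messages on port 32000 (an assumption; any free port works) */
  oscP5 = new OscP5(this, 32000);
}

void draw() {
  background(0);
}

void oscEvent(OscMessage m) {
  /* only handle /test messages carrying exactly two floats */
  if (m.checkAddrPattern("/test") && m.checkTypetag("ff")) {
    float x = m.get(0).floatValue();  // first value of the message
    float y = m.get(1).floatValue();  // second value of the message
    println("x: " + x + "  y: " + y);
  }
}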



I successfully completed all of the requested steps and came up with some cool distortions by adding randomly chosen effects. The application runs quite fluently and I was pleased to see many of the effects working nicely

Eventually, I managed to go beyond Firmata and connect the Arduino, Processing and Max 7, with an ultrasonic sensor influencing the visualisations in an effective manner
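
As a rough illustration of that chain (not my exact patch), a minimal bridge sketch could look like the one below: the Arduino prints one distance reading per line, and Processing normalises it and forwards it to Max as a single OSC float. The 2-200 cm range, the baud rate, the port numbers and the /distance address are all assumptions for the sake of the example.

//Hypothetical bridge sketch: Arduino distance reading -> Processing -> Max 7 over OSC
import processing.serial.*;
import oscP5.*;
import netP5.*;

Serial arduino;          // serial link to the Arduino
OscP5 osc;               // OSC client/server
NetAddress maxPatch;     // where the Max patch listens

void setup() {
  size(200, 200);
  arduino = new Serial(this, Serial.list()[0], 115200);  // adjust the index to your port
  osc = new OscP5(this, 12000);
  maxPatch = new NetAddress("127.0.0.1", 32000);
}

void draw() {
  String line = arduino.readStringUntil('\n');
  if (line != null) {
    float cm = parseFloat(trim(line));                       // raw distance in centimetres
    if (!Float.isNaN(cm)) {
      float level = constrain(map(cm, 2, 200, 0, 1), 0, 1);  // normalise to a 0..1 range
      OscMessage m = new OscMessage("/distance");            // assumed address pattern
      m.add(level);
      osc.send(m, maxPatch);                                 // Max receives a single float
    }
  }
}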

Most of the steps that I have experimented with have produced satisfactory effects



First of all, Max 7 is buggy. The application crashed a few times and froze up on me for extended periods (occasionally crashing afterwards); the audio/video, especially with the camera, also likes to freeze up randomly or not behave the way I intend it to. Even plugging in or unplugging my IEMs would make the music stop. That sort of makes sense, but it soured the experience I had with the application

Moreover, progress loss is unacceptable in professional-grade software; once I see Max crashing, that is something that should be treated as critical and addressed ASAP

Every now and then I would also come across issues I did not fully comprehend, such as data simply not flowing, with too few details (from my perspective) to work out why


//Processing Code
import processing.serial.*;
import oscP5.*;
import netP5.*;


OscP5 oscP5;

/* a NetAddress contains the ip address and port number of a remote location in the network. */
NetAddress myBroadcastLocation;

float sendOSCMouseX, sendOSCMouseY;

Serial myPort1;  // serial connection to the Arduino
Serial myPort2;  // additional ports reserved for further devices (unused here)
Serial myPort3;
String val1;     // last line of data received from the serial port
String val2;     // additional values reserved for further devices (unused here)
String val3;

void setup()
{
  size(400,400);
  // I know that the first port in the serial list on my mac
  // is Serial.list()[0].
  // On Windows machines, this generally opens COM1.
  // Open whatever port is the one you're using.
  String portName = Serial.list()[0]; //change the 0 to a 1 or 2 etc. to match your port
  myPort1 = new Serial(this, portName, 115200);
  
  frameRate(25);

  /* create a new instance of oscP5. 
   * 12000 is the port number you are listening for incoming osc messages.
   */
  oscP5 = new OscP5(this, 12000);

  /* create a new NetAddress. a NetAddress is used when sending osc messages
   * with the oscP5.send method.
   */

  /* the address of the osc broadcast server */
  myBroadcastLocation = new NetAddress("127.0.0.1", 32000);
}

void draw()
{
  background(0);
  fill(45, 124, 26);
  noStroke();

  // draw an ellipse at the mouse position and reuse the coordinates for OSC
  sendOSCMouseX = mouseX;
  sendOSCMouseY = mouseY;
  ellipse(sendOSCMouseX, sendOSCMouseY, 50, 50);

  // build an OSC message carrying the scaled mouse coordinates
  OscMessage myOscMessage = new OscMessage("/test");
  myOscMessage.add(sendOSCMouseX/100);
  myOscMessage.add(sendOSCMouseY/100);

  // read one line from the Arduino, if one is available
  String val = myPort1.readStringUntil('\n');
  val1 = trim(val);

  // only append the sensor value and send when a usable reading arrived
  // (String comparison needs equals(), not ==)
  if (val1 != null && !val1.equals("1")) {
    float sensorValue = parseFloat(val1);
    println(sensorValue);
    myOscMessage.add(sensorValue/100);
    oscP5.send(myOscMessage, myBroadcastLocation);
  }
}




void mousePressed() {
  /* create a new OscMessage with an address pattern, in this case /test. */
  
  /* add a value (an integer) to the OscMessage */
  
  /* send the OscMessage to a remote location specified in myNetAddress */
 
}


void keyPressed() {
  OscMessage m;
  switch(key) {
    case('c'):
      /* connect to the broadcaster */
      m = new OscMessage("/server/connect", new Object[0]);
      oscP5.flush(m, myBroadcastLocation);
      break;
    case('d'):
      /* disconnect from the broadcaster */
      m = new OscMessage("/server/disconnect", new Object[0]);
      oscP5.flush(m, myBroadcastLocation);
      break;
  }
}


/* incoming osc message are forwarded to the oscEvent method. */
void oscEvent(OscMessage theOscMessage) {
  /* get and print the address pattern and the typetag of the received OscMessage */
  println("### received an osc message with addrpattern "+theOscMessage.addrPattern()+" and typetag "+theOscMessage.typetag());
  theOscMessage.print();
}



Nothing really, it was a tutorial, so all the steps I took were pretty much the right ones, and whenever ambiguities arose, I sought help. Probably the right way to do it?

Although, all things considered, I would probably have looked into more filters. There are quite a few available out there that could have made a larger impact on my audio and video excerpts. That said, what I used was more than enough

I would also not use songs that are very long and heavy. I tried using some of the features I had recorded for my radio show and, considering they weigh over 80 MB and last for over an hour, it's no wonder Max 7 decided to rage quit (crash) on me



Hell yeah it does, I do enjoy creating a nice combination of devices all interconnected with one another. For the first assignment I already created a combination of an Arduino and a Processing app running on Android, communicating via Bluetooth with extra layers on top. It's always pleasant to see your effort turned into impactful solutions

So how 'bout dis: Arduino with sensors <-> Processing sketch on Android <-> Max 7, plus a massive amount of libraries and stuff like OpenCV, OSC/UDP communication, Bluetooth communication, serial communication. Like, what, wanna tell me I can't do dis? ._.

But seriously now, there is not that much that can be achieved at the moment beyond sound and vision. It seems we were shown a variety of ways of creating and manipulating such material while keeping a steady and efficient data flow



As above; however, I did see a lot of value in this laboratory, as I see many possibilities associated with it. On one hand, the communication methods are quite intriguing, as there are a handful that could be utilised for the upcoming assignment, even though our project is rather self-contained and works perfectly as a standalone intervention out in the field. It's an intervention within the environment, placemaking as many see it, so there is no point in venturing into areas that could disrupt an intervention that is sturdy and potentially enriching to the user experience.



That said, we are using music in our project, so some sort of intervention by users could be interesting. But how would we make it work? We'd need an app at the very least: either a device present on site, or making people download an external app with which they could manipulate the environment, even the lights and music if needed. Wouldn't that be going too far, though? How do the users find out how to download the app? How sure are we the app would work? What are the limitations? Would the app work over the internet or Bluetooth? How would we power each?

There are more questions than answers, and I think it is our team's belief that the intervention should be autonomous. It is taking place at the university, after all, at a site that is particularly soothing for people to stay present in. Why disrupt it?